

Physics-informed Neural Networks for Functional Differential Equations: Cylindrical Approximation and Its Convergence Guarantees

Neural Information Processing Systems

We propose the first learning scheme for functional differential equations (FDEs). FDEs play a fundamental role in physics, mathematics, and optimal control. However, the numerical analysis of FDEs has faced challenges due to its unrealistic computational costs and has been a long-standing problem for decades. Thus, numerical approximations of FDEs have been developed, but they often oversimplify the solutions. To tackle these two issues, we propose a hybrid approach combining physics-informed neural networks (PINNs) with the *cylindrical approximation*. The cylindrical approximation expands functions and functional derivatives in an orthonormal basis and transforms FDEs into high-dimensional PDEs. To validate the reliability of the cylindrical approximation for FDE applications, we prove convergence theorems for the approximated functional derivatives and solutions. The derived high-dimensional PDEs are then solved numerically with PINNs. Through the capabilities of PINNs, our approach can handle a broader class of functional derivatives more efficiently than conventional discretization-based methods, improving the scalability of the cylindrical approximation. As a proof of concept, we conduct experiments on two FDEs and demonstrate that our model achieves the typical L1 relative error order of PINNs, ~10^-3. Overall, our work provides a strong backbone for physicists, mathematicians, and machine learning experts to analyze previously challenging FDEs, thereby democratizing their numerical analysis, which has received limited attention.
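The core idea of the cylindrical approximation described above (expand a function in an orthonormal basis, truncate, and treat functionals as ordinary functions of the coefficients) can be illustrated numerically. The sketch below is not from the paper itself; it assumes orthonormal shifted Legendre polynomials on [0, 1] as the basis, and the paper's specific basis choice and discretization may differ.

```python
import numpy as np
from numpy.polynomial import legendre

# Illustrative sketch (assumed basis, not the paper's exact setup):
# truncate a function to its first N coefficients in an orthonormal basis
# of shifted Legendre polynomials sqrt(2k+1) * P_k(2x - 1) on [0, 1].

def legendre_coeffs(f, n_modes, n_quad=200):
    """Project f onto the first n_modes orthonormal shifted Legendre polynomials."""
    # Gauss-Legendre quadrature on [-1, 1], mapped to [0, 1]
    t, w = legendre.leggauss(n_quad)
    x = 0.5 * (t + 1.0)
    w = 0.5 * w
    coeffs = []
    for k in range(n_modes):
        basis_k = np.sqrt(2 * k + 1) * legendre.Legendre.basis(k)(2 * x - 1)
        coeffs.append(np.sum(w * f(x) * basis_k))
    return np.array(coeffs)

def reconstruct(coeffs, x):
    """Rebuild the truncated (cylindrical) approximation from its coefficients."""
    out = np.zeros_like(x)
    for k, c in enumerate(coeffs):
        out += c * np.sqrt(2 * k + 1) * legendre.Legendre.basis(k)(2 * x - 1)
    return out

# Under truncation, a functional such as F[theta] = int_0^1 theta(x)^2 dx
# becomes an ordinary function of finitely many coefficients:
# F(c_0, ..., c_{N-1}) = sum_k c_k^2 (Parseval), i.e. a finite-dimensional
# problem that a PDE solver such as a PINN can, in principle, operate on.
f = lambda x: np.sin(2 * np.pi * x)
c = legendre_coeffs(f, n_modes=20)
xs = np.linspace(0.0, 1.0, 101)
err = np.max(np.abs(reconstruct(c, xs) - f(xs)))   # small truncation error
```

For smooth inputs the coefficients decay rapidly, so a modest number of modes already reconstructs the function to near machine precision; this fast spectral convergence is what makes the basis truncation viable before handing the resulting finite-dimensional PDE to a PINN.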


Asynchronous Dynamics of Continuous Time Neural Networks

Wang, Xin, Li, Qingnan, Blum, Edward K.

Neural Information Processing Systems

Motivated by mathematical modeling, analog implementation, and distributed simulation of neural networks, we present a definition of asynchronous dynamics for general continuous-time (CT) dynamical systems defined by ordinary differential equations, based on notions of local times and communication times. We provide some preliminary results on global asymptotic convergence of asynchronous dynamics for contractive and monotone CT dynamical systems. Applying these results to neural networks, we obtain conditions that ensure additive-type neural networks are asynchronizable.

